# Lightweight Llama

## MicroLlama

License: Apache-2.0 · Author: keeeeenw · Tags: Large Language Model, Transformers, English · Downloads: 2,955 · Likes: 46

MicroLlama is a 300-million-parameter Llama model pretrained by individual developer keeeeenw within a $500 budget, focused on English text generation tasks.
## TinyLlama v1.1 Chinese

License: Apache-2.0 · Author: TinyLlama · Tags: Large Language Model, Transformers, English · Downloads: 447 · Likes: 9

TinyLlama is a 1.1-billion-parameter small language model that adopts the same architecture and tokenizer as Llama 2, making it suitable for resource-constrained applications.
## TinyLlama v1.1 Math & Code

License: Apache-2.0 · Author: TinyLlama · Tags: Large Language Model, Transformers, English · Downloads: 3,436 · Likes: 11

TinyLlama is a compact language model with 1.1 billion parameters that adopts the same architecture and tokenizer as Llama 2, making it suitable for applications with limited compute and memory.
## TinyLlama v1.1

License: Apache-2.0 · Author: TinyLlama · Tags: Large Language Model, Transformers, English · Downloads: 42.11k · Likes: 92

TinyLlama is a small language model with 1.1 billion parameters that adopts the same architecture and tokenizer as Llama 2, making it suitable for resource-constrained applications.
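The TinyLlama checkpoints listed here share the Llama 2 architecture and tokenizer, so they load with the standard Hugging Face Transformers causal-LM classes. Below is a minimal sketch; the repo id `TinyLlama/TinyLlama_v1.1` and the generation settings are assumptions, not part of this listing, and the same pattern applies to MicroLlama or the math/code and Chinese variants by swapping the repo id.

```python
# Minimal sketch: load a TinyLlama-class base model and generate text.
# The repo id below is an assumption; substitute the checkpoint you actually want.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama_v1.1"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 1.1B parameters fit comfortably in fp16 on a small GPU
    device_map="auto",          # requires `accelerate`; places weights on GPU/CPU automatically
)

prompt = "Small language models are useful because"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```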
## Phalanx 512x460M MoE

License: Apache-2.0 · Author: Kquant03 · Tags: Large Language Model, Transformers, English · Downloads: 28 · Likes: 2

Phalanx is a lightweight mixture-of-experts model that combines 512 LiteLlama-460M-1T experts, suited to efficient inference and text generation tasks.
## TinyLlama 1.1B Medical

Author: therealcyberlord · Tags: Large Language Model · Downloads: 83 · Likes: 2

A medical-domain Q&A model fine-tuned from TinyLlama-1.1B-Chat-v1.0, optimized for medical text and question-answering scenarios.
## TinyLlama 1.1B Intermediate Step 1431k 3T

License: Apache-2.0 · Author: TinyLlama · Tags: Large Language Model, Transformers, English · Downloads: 25.04k · Likes: 173

TinyLlama is a 1.1B-parameter Llama model pretrained on 3 trillion tokens, designed to provide compact and efficient text generation.
## TinyLlama 1.1B Intermediate Step 1195k Token 2.5T

License: Apache-2.0 · Author: TinyLlama · Tags: Large Language Model, Transformers, English · Downloads: 419 · Likes: 52

TinyLlama is a compact 1.1B-parameter Llama model designed for resource-constrained environments; this intermediate checkpoint (step 1195k) covers roughly 2.5 trillion of the planned 3 trillion pretraining tokens.
## TinyLlama 1.1B Chat v0.6

License: Apache-2.0 · Author: TinyLlama · Tags: Large Language Model, English · Downloads: 11.60k · Likes: 98

TinyLlama is a 1.1-billion-parameter Llama model pretrained on 3 trillion tokens; this chat-tuned variant suits scenarios with limited compute and memory.
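Chat-tuned checkpoints expect prompts in the model's chat format. A minimal sketch, assuming the repo id `TinyLlama/TinyLlama-1.1B-Chat-v0.6`, that the checkpoint ships a chat template with its tokenizer, and example messages chosen here for illustration:

```python
# Minimal sketch: query a chat-tuned TinyLlama checkpoint.
# Repo id and messages are assumptions; the chat template is read from the tokenizer,
# so the same code works for other chat variants that provide one.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v0.6"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16, device_map="auto")

messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Explain in one sentence what makes TinyLlama small."},
]

# Build the prompt using the chat template stored with the tokenizer.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```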
## TinyLlama 1.1B Step 50K 105b

License: Apache-2.0 · Author: TinyLlama · Tags: Large Language Model, Transformers, English · Downloads: 14.41k · Likes: 133

TinyLlama is a 1.1B-parameter Llama model with a planned pretraining run of 3 trillion tokens, targeted to finish in 90 days on 16 A100-40G GPUs; this early checkpoint corresponds to training step 50K (about 105B tokens).